
Conversation


@roomote roomote bot commented Sep 1, 2025

This PR attempts to address Issue #7573 where users behind corporate proxies experience connection errors with self-hosted vLLM OpenAI-compatible APIs after upgrading from version 3.20.3.

Problem

After version 3.20.3, the application ships OpenAI SDK v5+, which requires explicit proxy configuration when working behind corporate proxies. Because no proxy configuration was being passed to the OpenAI client, connections failed.

Solution

This PR adds comprehensive proxy support by:

1. New Proxy Configuration Utility

  • Created src/api/providers/utils/proxy-config.ts that (see the sketch after this list):
    • Respects standard proxy environment variables (HTTP_PROXY, HTTPS_PROXY, NO_PROXY)
    • Supports both uppercase and lowercase variants
    • Handles NO_PROXY bypass patterns including wildcards (e.g., *.internal.com)
    • Validates proxy URLs before creating agents
    • Gracefully handles missing https-proxy-agent module
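
To make the described behavior concrete, below is a minimal sketch of such a helper. The names (getProxyAgent, getEnv, matchesNoProxy) are illustrative rather than the PR's actual exports, and it uses a static import where the PR loads https-proxy-agent dynamically:

import { HttpsProxyAgent } from "https-proxy-agent"

function getEnv(name: string): string | undefined {
	// Uppercase variables win over lowercase variants in this sketch.
	return process.env[name.toUpperCase()] ?? process.env[name.toLowerCase()]
}

function matchesNoProxy(host: string, noProxy: string): boolean {
	return noProxy.split(",").some((entry) => {
		const pattern = entry.trim().toLowerCase()
		if (!pattern) return false
		if (pattern === "*") return true
		// Wildcards like *.internal.com match any subdomain.
		if (pattern.startsWith("*.")) return host.endsWith(pattern.slice(1))
		return host === pattern || host.endsWith(`.${pattern}`)
	})
}

export function getProxyAgent(targetUrl: string): HttpsProxyAgent<string> | undefined {
	const proxyUrl = getEnv("HTTPS_PROXY") ?? getEnv("HTTP_PROXY")
	if (!proxyUrl) return undefined

	// NO_PROXY bypass: skip the proxy entirely for matching hosts.
	const noProxy = getEnv("NO_PROXY")
	const host = new URL(targetUrl).hostname.toLowerCase()
	if (noProxy && matchesNoProxy(host, noProxy)) return undefined

	// Validate the proxy URL up front rather than failing inside the agent.
	try {
		new URL(proxyUrl)
	} catch {
		return undefined
	}
	return new HttpsProxyAgent(proxyUrl)
}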

2. Integration with OpenAI Providers

  • Updated BaseOpenAiCompatibleProvider to use proxy configuration (client wiring sketched after this list)
  • Updated OpenAiHandler to apply proxy settings for all client types:
    • Standard OpenAI
    • Azure OpenAI
    • Azure AI Inference
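
A hedged sketch of that wiring, assuming an SDK release that still accepts a Node httpAgent option (openai v4 does; newer fetch-based SDK releases configure proxies differently), with a placeholder endpoint:

import OpenAI from "openai"
import { getProxyAgent } from "./proxy-config" // hypothetical helper sketched above

const baseURL = "http://vllm.internal.com:8000/v1" // placeholder self-hosted endpoint
const client = new OpenAI({
	apiKey: "sk-placeholder",
	baseURL,
	// undefined when no proxy is configured or NO_PROXY bypasses this host
	httpAgent: getProxyAgent(baseURL),
})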

3. Comprehensive Testing

  • Added 17 test cases (one is sketched for illustration after this list) covering:
    • Basic proxy configuration
    • NO_PROXY bypass patterns
    • Case sensitivity handling
    • Error scenarios
    • Invalid URL handling
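
For illustration, one such case might look like the following under vitest, reusing the hypothetical getProxyAgent helper from above (the PR's actual spec file may differ):

import { afterEach, beforeEach, describe, expect, it } from "vitest"
import { getProxyAgent } from "../proxy-config"

describe("getProxyAgent", () => {
	const savedEnv = { ...process.env }

	beforeEach(() => {
		process.env.HTTPS_PROXY = "http://proxy.corp.example:8080"
		process.env.NO_PROXY = "*.internal.com"
	})

	afterEach(() => {
		// Restore the environment so cases stay independent.
		process.env = { ...savedEnv } as NodeJS.ProcessEnv
	})

	it("bypasses the proxy for NO_PROXY wildcard matches", () => {
		expect(getProxyAgent("https://vllm.internal.com/v1")).toBeUndefined()
	})

	it("returns an agent for hosts not matched by NO_PROXY", () => {
		expect(getProxyAgent("https://api.openai.com/v1")).toBeDefined()
	})
})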

Testing

  • All existing tests pass ✅
  • New proxy configuration tests pass ✅
  • ESLint and TypeScript checks pass ✅

Impact

This fix enables users behind corporate proxies to connect to their self-hosted vLLM and other OpenAI-compatible APIs, restoring functionality that was working in v3.20.3.

Fixes #7573

Feedback and guidance are welcome!


Important

Adds proxy support for OpenAI-compatible providers, restoring connectivity for users behind corporate proxies.

  • Behavior:
    • Adds proxy support for OpenAI-compatible providers by introducing getProxyConfig in proxy-config.ts.
    • Integrates proxy configuration into BaseOpenAiCompatibleProvider and OpenAiHandler.
  • Utility:
    • getProxyConfig respects the HTTP_PROXY, HTTPS_PROXY, and NO_PROXY environment variables, accepts both uppercase and lowercase variants, and handles invalid URLs.
    • Validates proxy URLs and uses https-proxy-agent if available.
  • Testing:
    • Adds 17 test cases in proxy-config.spec.ts for proxy configuration, NO_PROXY patterns, case sensitivity, and error handling.
  • Impact:
    • Restores connectivity for users behind corporate proxies using self-hosted vLLM and OpenAI-compatible APIs.

This description was created by Ellipsis for dc42ec0.

- Add proxy configuration utility that respects HTTP_PROXY, HTTPS_PROXY, and NO_PROXY environment variables
- Update BaseOpenAiCompatibleProvider to use proxy configuration when available
- Update OpenAiHandler to use proxy configuration for all OpenAI client variants (standard, Azure, Azure AI Inference)
- Add comprehensive tests for proxy configuration handling

Fixes #7573 - API connection errors with self-hosted vLLM behind corporate proxy
@roomote roomote bot requested review from cte, jr and mrubens as code owners September 1, 2025 10:32
@dosubot dosubot bot added the size:L (This PR changes 100-499 lines, ignoring generated files.) and bug (Something isn't working) labels Sep 1, 2025
@roomote roomote bot left a comment

I wrote this code and even I can see it's only half-baked.

Critical Issues

1. Incomplete proxy support across providers
The PR only adds proxy configuration to BaseOpenAiCompatibleProvider and OpenAiHandler, but several other providers that create OpenAI clients directly are missing proxy support:

  • ollama.ts (lines 37-42) - Creates OpenAI client without proxy config
  • lm-studio.ts (lines 27-31) - Creates OpenAI client without proxy config
  • openai-native.ts (line 63) - Creates OpenAI client without proxy config
  • openrouter.ts (line 96) - Creates OpenAI client without proxy config
  • huggingface.ts (lines 25-29) - Creates OpenAI client without proxy config

Without adding proxy support to these providers, users behind corporate proxies still won't be able to connect to these services.

2. Missing dependency declaration
The https-proxy-agent package is used via require() but I don't see it declared in package.json dependencies. This will cause runtime errors if the package isn't installed.

Important Suggestions

3. Dynamic require usage
Using require() for dynamic imports (line 11 in proxy-config.ts) could cause bundling issues. Consider using dynamic import() or ensuring the module is properly handled by the bundler.
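
One bundler-friendlier alternative is a lazy dynamic import() with a cached result; this is a sketch of the suggestion, not the PR's code:

type AgentCtor = new (proxy: string) => unknown

let cachedAgentCtor: AgentCtor | undefined
let loadAttempted = false

export async function loadHttpsProxyAgent(): Promise<AgentCtor | undefined> {
	if (loadAttempted) return cachedAgentCtor
	loadAttempted = true
	try {
		// Bundlers generally handle import() more predictably than a bare require().
		const mod = await import("https-proxy-agent")
		cachedAgentCtor = mod.HttpsProxyAgent as unknown as AgentCtor
	} catch {
		// Module absent: leave the cache undefined so proxy support stays disabled.
	}
	return cachedAgentCtor
}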

4. Console.warn in production
The warning messages (lines 14 and 57 in proxy-config.ts) should use a proper logging mechanism rather than console.warn.
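
Since Roo Code runs as a VS Code extension, one option is routing such warnings through an output channel; this is a sketch only, and the channel name is an assumption:

import * as vscode from "vscode"

const channel = vscode.window.createOutputChannel("Roo Code Proxy")

export function warnProxy(message: string): void {
	// Surfaces in the Output panel instead of being lost in the console.
	channel.appendLine(`[warn] ${message}`)
}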

Minor Improvements

5. Test coverage for edge cases
While the tests are comprehensive, consider adding tests for the following (the first two are sketched after this list):

  • Proxy URLs with authentication (e.g., http://user:pass@proxy.example.com:8080)
  • IPv6 addresses in NO_PROXY
  • Multiple proxy URLs in sequence
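
Illustrative versions of the first two cases, again against the hypothetical getProxyAgent (environment cleanup is omitted for brevity, and NO_PROXY conventions for IPv6 literals vary between tools):

import { expect, it } from "vitest"
import { getProxyAgent } from "../proxy-config"

it("accepts credentials embedded in the proxy URL", () => {
	process.env.HTTPS_PROXY = "http://user:pass@proxy.example.com:8080"
	expect(getProxyAgent("https://api.openai.com/v1")).toBeDefined()
})

it("bypasses the proxy for an IPv6 host listed in NO_PROXY", () => {
	process.env.HTTPS_PROXY = "http://proxy.example.com:8080"
	process.env.NO_PROXY = "[::1]" // bracketed form, matching the WHATWG URL hostname
	expect(getProxyAgent("http://[::1]:8000/v1")).toBeUndefined()
})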

6. Documentation
Consider adding JSDoc comments to explain the proxy configuration behavior and environment variable precedence.
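
For instance, a header along these lines (the wording and precedence describe the sketches above, not necessarily the PR's actual behavior):

import type { HttpsProxyAgent } from "https-proxy-agent"

/**
 * Resolves a proxy agent for the given target URL.
 *
 * Precedence: HTTPS_PROXY is consulted before HTTP_PROXY, and uppercase
 * variables win over their lowercase variants. Comma-separated NO_PROXY
 * entries (including *.domain wildcards) bypass the proxy for matching hosts.
 *
 * @param targetUrl - Absolute URL of the API endpoint being requested.
 * @returns An agent when a proxy applies, otherwise undefined.
 */
export declare function getProxyAgent(targetUrl: string): HttpsProxyAgent<string> | undefined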

Summary

This PR makes a good start at addressing the proxy issue, but needs to be extended to all providers that create OpenAI clients. The missing dependency also needs to be added to package.json.

// Dynamic import to avoid bundling issues
let HttpsProxyAgent: any
try {
	HttpsProxyAgent = require("https-proxy-agent").HttpsProxyAgent

@roomote roomote bot commented:
Is using require() intentional here? Dynamic requires can cause issues with bundlers. Consider using dynamic import or ensuring the bundler handles this correctly. Also, is https-proxy-agent listed in package.json dependencies?

	HttpsProxyAgent = require("https-proxy-agent").HttpsProxyAgent
} catch (error) {
	// If the module is not available, return undefined
	console.warn("https-proxy-agent module not available, proxy support disabled")

@roomote roomote bot commented:

Consider using a proper logging mechanism instead of console.warn. In production, these warnings might not be visible or could clutter logs.

@hannesrudolph hannesrudolph added the Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels.) label Sep 1, 2025
@daniel-lxs daniel-lxs moved this from Triage to PR [Needs Prelim Review] in Roo Code Roadmap Sep 2, 2025
@hannesrudolph hannesrudolph added the PR - Needs Preliminary Review label and removed the Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels.) label Sep 2, 2025
@daniel-lxs (Member) commented:

Closing this PR for now. See #7573 (comment) for discussion on the root cause and next steps. We need more information to design a comprehensive solution that covers all providers properly.

@daniel-lxs daniel-lxs closed this Sep 2, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Sep 2, 2025
@github-project-automation github-project-automation bot moved this from PR [Needs Prelim Review] to Done in Roo Code Roadmap Sep 2, 2025
@daniel-lxs daniel-lxs deleted the fix/openai-compatible-proxy-support branch September 2, 2025 20:26
Linked issue: API Request Failed after upgrading from v3.20.3 to v3.26.3 (#7573)